Quickest Change Detection and Kullback-Leibler Divergence for Two-State Hidden Markov Models

Authors
Abstract


Similar articles

Quickest Change Detection in Hidden Markov Models for Sensor Networks

The decentralized quickest change detection problem is studied in sensor networks, where a set of sensors receive observations from a hidden Markov model (HMM) X and send sensor messages to a central processor, called the fusion center, which makes a final decision when observations are stopped. It is assumed that the parameter θ in the hidden Markov model for X changes from θ0 to θ1 at some un...
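Quickest change detection procedures of this kind are built on CUSUM-type statistics. As a minimal sketch (the decentralized, HMM-based procedure described above is more involved; the i.i.d. Bernoulli stream, change point, and threshold below are illustrative assumptions), Page's CUSUM test accumulates the log-likelihood ratio and raises an alarm when it crosses a threshold:

```python
import math
import random

def cusum_stopping_time(xs, log_lr, threshold):
    """Page's CUSUM: W_0 = 0, W_t = max(W_{t-1}, 0) + log_lr(x_t);
    raise an alarm at the first t with W_t >= threshold."""
    w = 0.0
    for t, x in enumerate(xs, start=1):
        w = max(w, 0.0) + log_lr(x)
        if w >= threshold:
            return t
    return None  # no alarm within the observed window

# Illustrative Bernoulli stream: success probability jumps from 0.2 to 0.6 at t = 100.
random.seed(1)
pre, post = 0.2, 0.6
xs = [int(random.random() < pre) for _ in range(100)] \
   + [int(random.random() < post) for _ in range(100)]
log_lr = lambda x: math.log(post / pre) if x else math.log((1 - post) / (1 - pre))
print("alarm at t =", cusum_stopping_time(xs, log_lr, threshold=5.0))
```

Before the change the log-likelihood-ratio increments have negative drift, so the statistic hovers near zero; after the change the drift turns positive and the statistic climbs toward the threshold, which is the mechanism the detection-delay analysis quantifies.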


Rényi Divergence and Kullback-Leibler Divergence

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
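The order-1 limit mentioned above is easy to check numerically. A minimal sketch for discrete distributions with full support (the function names and the distributions `p` and `q` are illustrative):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1):
    D_alpha(P || Q) = log( sum_i p_i^alpha * q_i^(1 - alpha) ) / (alpha - 1)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1))

p, q = [0.6, 0.3, 0.1], [0.4, 0.4, 0.2]
for alpha in (0.5, 0.9, 0.999):
    print(f"order {alpha}: {renyi_divergence(p, q, alpha):.5f}")
print(f"KL:          {kl_divergence(p, q):.5f}")  # the order -> 1 limit
```

As the order approaches 1 the Rényi divergence converges to the Kullback-Leibler divergence, and it is nondecreasing in the order, so the order-0.5 value sits below the KL value.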


Markov-switching model selection using Kullback–Leibler divergence

In Markov-switching regression models, we use Kullback–Leibler (KL) divergence between the true and candidate models to select the number of states and variables simultaneously. Specifically, we derive a new information criterion, Markov switching criterion (MSC), which is an estimate of KL divergence. MSC imposes an appropriate penalty to mitigate the overretention of states in the Markov chai...


Exact Computation of Kullback-Leibler Distance for Hidden Markov Trees and Models

We suggest new recursive formulas to compute the exact value of the Kullback-Leibler distance (KLD) between two general Hidden Markov Trees (HMTs). For homogeneous HMTs with regular topology, such as homogeneous Hidden Markov Models (HMMs), we obtain a closed-form expression for the KLD when no evidence is given. We generalize our recursive formulas to the case of HMMs conditioned on the observ...
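The paper's closed-form recursions are not reproduced in this snippet. For contrast, the quantity they compute exactly is commonly estimated by Monte Carlo: sample a long observation sequence from one HMM and average the log-likelihood ratio computed with the scaled forward algorithm. A sketch under that standard approach (all parameter values below are illustrative assumptions):

```python
import numpy as np

def sample_obs(A, B, pi, n, rng):
    """Sample an observation sequence of length n from an HMM
    (A: state transition matrix, B: per-state emission probabilities,
    pi: initial state distribution)."""
    obs = np.empty(n, dtype=int)
    s = rng.choice(len(pi), p=pi)
    for t in range(n):
        obs[t] = rng.choice(B.shape[1], p=B[s])
        s = rng.choice(len(pi), p=A[s])
    return obs

def log_likelihood(obs, A, B, pi):
    """Scaled forward algorithm; returns log p(obs | A, B, pi)."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

# Two two-state HMMs over a binary alphabet (illustrative parameters).
A_p = np.array([[0.9, 0.1], [0.2, 0.8]]); B_p = np.array([[0.7, 0.3], [0.4, 0.6]])
A_q = np.array([[0.8, 0.2], [0.3, 0.7]]); B_q = np.array([[0.6, 0.4], [0.5, 0.5]])
pi = np.array([0.5, 0.5])

rng = np.random.default_rng(0)
n = 20_000
obs = sample_obs(A_p, B_p, pi, n, rng)
# KL divergence rate estimate: (1/n) * log[ p(obs) / q(obs) ]
kl_rate = (log_likelihood(obs, A_p, B_p, pi) - log_likelihood(obs, A_q, B_q, pi)) / n
print(f"estimated KL rate: {kl_rate:.4f}")
```

The Monte Carlo estimate converges only at rate O(1/sqrt(n)), which is the practical motivation for exact recursive formulas like those the abstract describes.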


Dysarthric Speech Recognition Using Kullback-Leibler Divergence-Based Hidden Markov Model

Dysarthria is a neuro-motor speech disorder that impedes the physical production of speech. Patients with dysarthria often have trouble in pronouncing certain sounds, resulting in undesirable phonetic variation. Current automatic speech recognition systems designed for the general public are ineffective for dysarthric sufferers due to the phonetic variation. In this paper, we investigate dysart...



Journal

Journal title: IEEE Transactions on Signal Processing

Year: 2015

ISSN: 1053-587X,1941-0476

DOI: 10.1109/tsp.2015.2447506